2025-11-13 08:32:42,989 [ 98684 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse (runner:53, check_args_and_update_paths)
2025-11-13 08:32:42,989 [ 98684 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:79, check_args_and_update_paths)
2025-11-13 08:32:42,989 [ 98684 ] INFO : utils dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/utils (runner:90, check_args_and_update_paths)
2025-11-13 08:32:42,989 [ 98684 ] INFO : base_configs_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:92, check_args_and_update_paths)
clickhouse_integration_tests_volume
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_hlyj5n --privileged --dns-search='.' --memory=30709030912 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e CLICKHOUSE_USE_OLD_ANALYZER=1 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 '.
Start tests
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
Test order randomisation NOT enabled.
Enable with --random-order or --random-order-bucket=
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: timeout-2.3.1, repeat-0.9.3, order-1.0.0, reportlog-0.4.0, xdist-3.5.0, random-order-1.1.1
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [5 items]

scheduling tests via LoadFileScheduling

test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
test_database_delta/test.py::test_complex_table_schema
[gw1] [ 20%] SKIPPED test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
[gw2] [ 40%] FAILED test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
[gw0] [ 60%] FAILED test_database_delta/test.py::test_complex_table_schema
test_database_delta/test.py::test_embedded_database_and_tables
[gw0] [ 80%] FAILED test_database_delta/test.py::test_embedded_database_and_tables
test_database_delta/test.py::test_multiple_schemes_tables
[gw0] [100%] FAILED test_database_delta/test.py::test_multiple_schemes_tables

=================================== FAILURES ===================================
___________________________ test_different_versions ____________________________
[gw2] linux -- Python 3.10.12 /usr/bin/python3

    def test_different_versions():
        new_node.query(
            "CREATE TABLE tbl"
            " ON CLUSTER 'cluster_ver'"
            " (x UInt64) ENGINE=ReplicatedMergeTree('/clickhouse/tables/tbl/', '{replica}')"
            " ORDER BY tuple()"
        )
        new_node.query(f"INSERT INTO tbl VALUES (1)")
        old_node.query(f"INSERT INTO tbl VALUES (2)")
        backup_name = new_backup_name()
        initiator = random_node()
        print(f"Using {get_node_name(initiator)} as initiator for BACKUP")
        initiator.query(f"BACKUP TABLE tbl ON CLUSTER 'cluster_ver' TO {backup_name}")
        new_node.query("DROP TABLE tbl ON CLUSTER 'cluster_ver' SYNC")
        initiator = random_node()
        print(f"Using {get_node_name(initiator)} as initiator for RESTORE")
        initiator.query(f"RESTORE TABLE tbl ON CLUSTER 'cluster_ver' FROM {backup_name}")
        new_node.query("SYSTEM SYNC REPLICA ON CLUSTER 'cluster_ver' tbl")
        assert new_node.query("SELECT * FROM tbl ORDER BY x") == TSV([1, 2])
        assert old_node.query("SELECT * FROM tbl ORDER BY x") == TSV([1, 2])
        # Error NO_ELEMENTS_IN_CONFIG is unrelated.
>       assert (
            new_node.query(
                "SELECT name, last_error_message FROM system.errors WHERE NOT ("
                "(name == 'NO_ELEMENTS_IN_CONFIG')"
                ")"
            )
            == ""
        )
E       assert "NETLINK_ERROR\tCan\\'t receive Netlink response: error -2\n" == ''
E         + NETLINK_ERROR Can\'t receive Netlink response: error -2

test_backup_restore_on_cluster/test_different_versions.py:105: AssertionError
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config. Files: config.xml, users.xml
Copy common default production configuration from /clickhouse-config.
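Note on the failure above: the BACKUP/RESTORE round trip itself succeeded (both SELECTs returned [1, 2]); the assertion fails only because system.errors on new_node contains NETLINK_ERROR ("Can't receive Netlink response: error -2"), which appears to come from the netlink/taskstats probe of the container environment rather than from the backup code path being tested. A minimal sketch of how such an environment-level error could be tolerated alongside NO_ELEMENTS_IN_CONFIG is shown below; the helper name and the extended ignore list are illustrative assumptions, not the test's actual code or the upstream fix.

# Illustrative sketch only (hypothetical helper): treat NETLINK_ERROR the same way
# the test already treats NO_ELEMENTS_IN_CONFIG, i.e. as an error emitted by the
# environment and unrelated to BACKUP/RESTORE.
IGNORED_ERROR_NAMES = ["NO_ELEMENTS_IN_CONFIG", "NETLINK_ERROR"]

def assert_no_unexpected_errors(node):
    ignored = ", ".join(f"'{name}'" for name in IGNORED_ERROR_NAMES)
    unexpected = node.query(
        f"SELECT name, last_error_message FROM system.errors WHERE name NOT IN ({ignored})"
    )
    # Any row left here is an error the test did not expect to see.
    assert unexpected == "", f"Unexpected entries in system.errors:\n{unexpected}"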
Files: config.xml, users.xml ------------------------------ Captured log setup ------------------------------ 2025-11-13 08:32:49.815000 [ 649 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.834000 [ 649 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:49.835000 [ 649 ] DEBUG : No running containers (conftest.py:95, cleanup_environment) 2025-11-13 08:32:49.835000 [ 649 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment) 2025-11-13 08:32:49.835000 [ 649 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.861000 [ 649 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.863000 [ 649 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check) 2025-11-13 08:32:49.864000 [ 649 ] INFO : Running tests in /ClickHouse/tests/integration/test_backup_restore_on_cluster/test_different_versions.py (cluster.py:2738, start) 2025-11-13 08:32:49.864000 [ 649 ] DEBUG : Cluster start called. is_up=False (cluster.py:2745, start) 2025-11-13 08:32:49.887000 [ 649 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:49.910000 [ 649 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:49.931000 [ 649 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:49.931000 [ 649 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:32:49.956000 [ 649 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:49.977000 [ 649 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:50.003000 [ 649 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:50.003000 [ 649 ] DEBUG : Command:[docker container list --all --filter name='^/roottestbackuprestoreonclusterdifferentversions-gw2-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.033000 [ 649 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-11-13 08:32:50.034000 [ 649 ] DEBUG : No running containers for project: roottestbackuprestoreonclusterdifferentversions-gw2 (cluster.py:879, cleanup) 2025-11-13 08:32:50.034000 [ 649 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-11-13 08:32:50.065000 [ 649 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-11-13 08:32:50.065000 [ 649 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.102000 [ 649 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check) 2025-11-13 08:32:50.102000 [ 649 ] DEBUG : Images pruned (cluster.py:904, cleanup) 2025-11-13 08:32:50.102000 [ 649 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-11-13 08:32:50.102000 [ 649 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.125000 [ 649 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:50.125000 [ 649 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup) 2025-11-13 08:32:50.126000 [ 649 ] DEBUG : Setup directory for instance: new_node (cluster.py:2758, start) 2025-11-13 08:32:50.127000 [ 649 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:50.127000 [ 649 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:50.128000 [ 649 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:50.129000 [ 649 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:50.130000 [ 649 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/backups_disk.xml', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/cluster_different_versions.xml'] to /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:50.130000 [ 649 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/database (cluster.py:4758, create_dir) 2025-11-13 08:32:50.131000 [ 649 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:50.131000 [ 649 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:50.131000 [ 649 ] INFO : external_dir_abs_path=/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/backups (cluster.py:4879, create_dir) 2025-11-13 08:32:50.131000 [ 649 ] DEBUG : Setup directory for instance: old_node (cluster.py:2758, start) 2025-11-13 08:32:50.132000 [ 649 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:50.132000 [ 649 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:50.133000 [ 649 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:50.133000 [ 649 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:50.134000 [ 649 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/backups_disk.xml', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/cluster_different_versions.xml'] to /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:50.134000 [ 649 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/database (cluster.py:4758, create_dir) 2025-11-13 08:32:50.134000 [ 649 ] DEBUG : Setup logs dir 
/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:50.134000 [ 649 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:50.134000 [ 649 ] INFO : external_dir_abs_path=/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/backups (cluster.py:4879, create_dir) 2025-11-13 08:32:50.135000 [ 649 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'keeper_binary': '/clickhouse', 'keeper_cmd_prefix': 'clickhouse keeper', 'image': 'altinityinfra/integration-test:5ccda723c1fc', 'user': '0', 'keeper_fs': 'bind', 'keeper_logs_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/log', 'keeper_config_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/config', 'keeper_db_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/coordination', 'keeper_logs_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/log', 'keeper_config_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/config', 'keeper_db_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/coordination', 'keeper_logs_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/log', 'keeper_config_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/config', 'keeper_db_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/coordination'} stored in /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env (cluster.py:96, _create_env_file) 2025-11-13 08:32:50.135000 [ 649 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:50.135000 [ 649 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:50.135000 [ 649 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:50.136000 [ 649 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:50.143000 [ 649 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request) 2025-11-13 08:32:50.144000 [ 649 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file 
/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/docker-compose.yml pull] (cluster.py:121, run_and_check) 2025-11-13 08:33:00.590000 [ 649 ] DEBUG : Stderr: zoo2 Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:33:00.590000 [ 649 ] DEBUG : Stderr: zoo3 Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:33:00.590000 [ 649 ] DEBUG : Stderr: new_node Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:33:00.590000 [ 649 ] DEBUG : Stderr: old_node Pulling (cluster.py:147, run_and_check) 2025-11-13 08:33:00.590000 [ 649 ] DEBUG : Stderr: zoo1 Pulling (cluster.py:147, run_and_check) 2025-11-13 08:33:00.591000 [ 649 ] DEBUG : Stderr: old_node Pulled (cluster.py:147, run_and_check) 2025-11-13 08:33:00.591000 [ 649 ] DEBUG : Stderr: zoo1 Pulled (cluster.py:147, run_and_check) 2025-11-13 08:33:00.591000 [ 649 ] DEBUG : Setup ZooKeeper (cluster.py:2799, start) 2025-11-13 08:33:00.591000 [ 649 ] DEBUG : Creating internal ZooKeeper dirs: ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper1/coordination', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper2/coordination', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/keeper3/coordination'] (cluster.py:2800, start) 2025-11-13 08:33:00.593000 [ 649 ] DEBUG : Command:[docker compose --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --verbose up -d] (cluster.py:121, run_and_check) 2025-11-13 08:33:01.499000 [ 649 ] DEBUG : Stderr:time="2025-11-13T08:33:00Z" level=trace msg="Docker Desktop integration not enabled" (cluster.py:147, run_and_check) 2025-11-13 08:33:01.499000 [ 649 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw2_default Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw2_default Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Creating (cluster.py:147, 
run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.500000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:01.501000 [ 649 ] DEBUG : Stderr:time="2025-11-13T08:33:01Z" level=debug msg="otel error" error="" (cluster.py:147, run_and_check) 2025-11-13 08:33:01.502000 [ 649 ] DEBUG : Stderr:time="2025-11-13T08:33:01Z" level=debug msg="otel error" error="" (cluster.py:147, run_and_check) 2025-11-13 08:33:01.502000 [ 649 ] DEBUG : Wait ZooKeeper to start (cluster.py:2436, wait_zookeeper_to_start) 2025-11-13 08:33:01.502000 [ 649 ] DEBUG : get_instance_ip instance_name=zoo1 (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:01.506000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.507000 [ 649 ] DEBUG : get_kazoo_client: zoo1, ip:172.16.2.3, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:33:01.508000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:01.509000 [ 649 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:33:01.618000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:01.619000 [ 649 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:33:01.810000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:01.811000 [ 649 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:33:02.284000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:02.285000 [ 649 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:33:03.047000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False 
(connection.py:650, _connect) 2025-11-13 08:33:03.048000 [ 649 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:33:04.832000 [ 649 ] INFO : Connecting to 172.16.2.3(172.16.2.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:04.832000 [ 649 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:33:04.838000 [ 649 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:33:04.838000 [ 649 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:33:04.839000 [ 649 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:33:04.839000 [ 649 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:33:04.842000 [ 649 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:33:04.843000 [ 649 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:33:04.843000 [ 649 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:33:04.929000 [ 649 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. (connection.py:515, zk_loop) 2025-11-13 08:33:04.929000 [ 649 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:33:04.929000 [ 649 ] DEBUG : get_instance_ip instance_name=zoo2 (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:04.932000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:04.932000 [ 649 ] DEBUG : get_kazoo_client: zoo2, ip:172.16.2.2, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:33:04.933000 [ 649 ] INFO : Connecting to 172.16.2.2(172.16.2.2):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:04.933000 [ 649 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:33:04.937000 [ 649 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:33:04.937000 [ 649 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:33:04.938000 [ 649 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:33:04.938000 [ 649 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:33:04.941000 [ 649 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:33:04.941000 [ 649 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:33:04.941000 [ 649 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:33:05.042000 [ 649 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. 
(connection.py:515, zk_loop) 2025-11-13 08:33:05.042000 [ 649 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:33:05.042000 [ 649 ] DEBUG : get_instance_ip instance_name=zoo3 (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:05.045000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.045000 [ 649 ] DEBUG : get_kazoo_client: zoo3, ip:172.16.2.4, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:33:05.046000 [ 649 ] INFO : Connecting to 172.16.2.4(172.16.2.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:33:05.047000 [ 649 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:33:05.051000 [ 649 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:33:05.052000 [ 649 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:33:05.052000 [ 649 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:33:05.053000 [ 649 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:33:05.055000 [ 649 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:33:05.055000 [ 649 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:33:05.055000 [ 649 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:33:05.156000 [ 649 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. 
(connection.py:515, zk_loop) 2025-11-13 08:33:05.156000 [ 649 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:33:05.156000 [ 649 ] DEBUG : All instances of ZooKeeper started: ('zoo1', 'zoo2', 'zoo3') (cluster.py:2452, wait_zookeeper_nodes_to_start) 2025-11-13 08:33:05.157000 [ 649 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/docker-compose.yml up -d --no-recreate') (cluster.py:3139, start) 2025-11-13 08:33:05.157000 [ 649 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:05.581000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:05.582000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:05.582000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:05.582000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:05.582000 [ 649 ] DEBUG : ClickHouse instance created 
(cluster.py:3147, start) 2025-11-13 08:33:05.582000 [ 649 ] DEBUG : get_instance_ip instance_name=new_node (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:05.583000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.584000 [ 649 ] DEBUG : get_instance_ip instance_name=new_node (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:33:05.585000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.586000 [ 649 ] DEBUG : Waiting for ClickHouse start in new_node, ip: 172.16.2.6... (cluster.py:3155, start) 2025-11-13 08:33:05.588000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.589000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.692000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.795000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:05.898000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.000000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.103000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.206000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.308000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.411000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.514000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.616000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.719000 [ 649 ] DEBUG : http://localhost:None "GET 
/v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.823000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:06.926000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.029000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.132000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.235000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/02c8e5f105dbd5fb5236cc984971ae3690914a5404ad4e8daefb7b10819d19bb/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.235000 [ 649 ] DEBUG : ClickHouse new_node started (cluster.py:3159, start) 2025-11-13 08:33:07.236000 [ 649 ] DEBUG : get_instance_ip instance_name=old_node (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:07.237000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.237000 [ 649 ] DEBUG : get_instance_ip instance_name=old_node (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:33:07.239000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:07.239000 [ 649 ] DEBUG : Waiting for ClickHouse start in old_node, ip: 172.16.2.5... 
(cluster.py:3155, start)
2025-11-13 08:33:07.241000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-11-13 08:33:07.243000 [ 649 ] DEBUG : http://localhost:None "GET /v1.46/containers/dc6cdd395120010dcf5ce859811ff237e46515d60dad314d314116e8df9968fc/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-11-13 08:33:07.243000 [ 649 ] DEBUG : ClickHouse old_node started (cluster.py:3159, start)
----------------------------- Captured stdout call -----------------------------
Using new_node as initiator for BACKUP
Using new_node as initiator for RESTORE
------------------------------ Captured log call -------------------------------
2025-11-13 08:33:07.245000 [ 649 ] DEBUG : Executing query CREATE TABLE tbl ON CLUSTER 'cluster_ver' (x UInt64) ENGINE=ReplicatedMergeTree('/clickhouse/tables/tbl/', '{replica}') ORDER BY tuple() on new_node (cluster.py:3648, query)
2025-11-13 08:33:07.761000 [ 649 ] DEBUG : Executing query INSERT INTO tbl VALUES (1) on new_node (cluster.py:3648, query)
2025-11-13 08:33:07.977000 [ 649 ] DEBUG : Executing query INSERT INTO tbl VALUES (2) on old_node (cluster.py:3648, query)
2025-11-13 08:33:08.193000 [ 649 ] DEBUG : Executing query BACKUP TABLE tbl ON CLUSTER 'cluster_ver' TO Disk('backups', '1') on new_node (cluster.py:3648, query)
2025-11-13 08:33:08.860000 [ 649 ] DEBUG : Executing query DROP TABLE tbl ON CLUSTER 'cluster_ver' SYNC on new_node (cluster.py:3648, query)
2025-11-13 08:33:09.277000 [ 649 ] DEBUG : Executing query RESTORE TABLE tbl ON CLUSTER 'cluster_ver' FROM Disk('backups', '1') on new_node (cluster.py:3648, query)
2025-11-13 08:33:09.945000 [ 649 ] DEBUG : Executing query SYSTEM SYNC REPLICA ON CLUSTER 'cluster_ver' tbl on new_node (cluster.py:3648, query)
2025-11-13 08:33:10.312000 [ 649 ] DEBUG : Executing query SELECT * FROM tbl ORDER BY x on new_node (cluster.py:3648, query)
2025-11-13 08:33:10.579000 [ 649 ] DEBUG : Executing query SELECT * FROM tbl ORDER BY x on old_node (cluster.py:3648, query)
2025-11-13 08:33:10.845000 [ 649 ] DEBUG : Executing query SELECT name, last_error_message FROM system.errors WHERE NOT ((name == 'NO_ELEMENTS_IN_CONFIG')) on new_node (cluster.py:3648, query)
---------------------------- Captured log teardown -----------------------------
2025-11-13 08:33:11.130000 [ 649 ] DEBUG : Executing query DROP TABLE IF EXISTS tbl ON CLUSTER 'cluster_ver' SYNC on new_node (cluster.py:3648, query)
2025-11-13 08:33:11.496000 [ 649 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check)
2025-11-13 08:33:19.689000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Stopping (cluster.py:147, run_and_check)
2025-11-13 08:33:19.689000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Stopping (cluster.py:147,
run_and_check) 2025-11-13 08:33:19.689000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:19.689000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:19.689000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:19.690000 [ 649 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check) 2025-11-13 08:33:19.703000 [ 649 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check) 2025-11-13 08:33:19.717000 [ 649 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw2 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw2/old_node/docker-compose.yml down --volumes] (cluster.py:121, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container 
roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-new_node-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-old_node-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.302000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo1-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo2-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw2-zoo3-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw2_default Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:20.303000 [ 649 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw2_default Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:20.304000 [ 649 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:33:20.321000 [ 649 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:33:20.346000 [ 649 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:33:20.366000 [ 649 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 
08:33:20.367000 [ 649 ] DEBUG : Command:[docker container list --all --filter name='^/roottestbackuprestoreonclusterdifferentversions-gw2-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-11-13 08:33:20.396000 [ 649 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup)
2025-11-13 08:33:20.396000 [ 649 ] DEBUG : No running containers for project: roottestbackuprestoreonclusterdifferentversions-gw2 (cluster.py:879, cleanup)
2025-11-13 08:33:20.396000 [ 649 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup)
2025-11-13 08:33:20.423000 [ 649 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup)
2025-11-13 08:33:20.423000 [ 649 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-11-13 08:33:20.454000 [ 649 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-11-13 08:33:20.455000 [ 649 ] DEBUG : Images pruned (cluster.py:904, cleanup)
2025-11-13 08:33:20.455000 [ 649 ] DEBUG : Trying to prune unused volumes... (cluster.py:910, cleanup)
2025-11-13 08:33:20.455000 [ 649 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-11-13 08:33:20.482000 [ 649 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-11-13 08:33:20.483000 [ 649 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup)
__________________________ test_complex_table_schema ___________________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_complex_table_schema(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_spark_query(node1, "CREATE SCHEMA schema_with_complex_tables", ignore_exit_code=True)
        schema = "event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT"
        create_query = f"CREATE TABLE schema_with_complex_tables.complex_table ({schema}) using Delta location '/tmp/complex_schema/complex_table'"
        execute_spark_query(node1, create_query, ignore_exit_code=True)
        execute_spark_query(node1, "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\\\"f1\\\", 34, \\\"f2\\\", 'hello')", ignore_exit_code=True)
        node1.query("create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        complex_schema_tables = list(sorted(node1.query("SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        assert len(complex_schema_tables) == 1
>       print(node1.query("SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`"))

test_database_delta/test.py:125: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed! Return code: 86, stderr: Received exception from server (version 25.3.8):
E           Code: 86. DB::Exception: Received from 172.16.3.2:9000. DB::HTTPException. DB::HTTPException: Received error from remote server http://localhost:8080/api/2.1/unity-catalog/tables/unity.schema_with_complex_tables.complex_table. HTTP status code: 404 'Not Found', body length: 182 bytes, body: '{"error_code":"NOT_FOUND","details":[{"reason":"NOT_FOUND","metadata":{},"@type":"google.rpc.ErrorInfo"}],"stack_trace":null,"message":"Schema not found: schema_with_complex_tables"}': while parsing JSON: . Stack trace:
E
E           0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x000000003833d451
E           1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000001bd6da31
E           2. DB::Exception::Exception(PreformattedMessage&&, int) @ 0x000000000c3a520b
E           3. ./src/Common/Exception.h:130: DB::HTTPException::makeExceptionMessage(int, String const&, Poco::Net::HTTPResponse::HTTPStatus, String const&, String const&) @ 0x000000001c51995e
E           4. ./src/IO/HTTPCommon.h:33: DB::HTTPException::HTTPException(int, String const&, Poco::Net::HTTPResponse::HTTPStatus, String const&, String const&) @ 0x000000001c519ce9
E           5. ./build_docker/./src/IO/HTTPCommon.cpp:93: DB::assertResponseIsOk(String const&, Poco::Net::HTTPResponse&, std::basic_istream>&, bool) @ 0x000000001c51962b
E           6. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:277: DB::ReadWriteBufferFromHTTP::callImpl(Poco::Net::HTTPResponse&, String const&, std::optional const&, bool) const @ 0x0000000021220873
E           7. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:285: DB::ReadWriteBufferFromHTTP::callWithRedirects(Poco::Net::HTTPResponse&, String const&, std::optional const&) @ 0x0000000021220dbc
E           8. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:408: DB::ReadWriteBufferFromHTTP::initialize() @ 0x0000000021221e3b
E           9. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:472: void std::__function::__policy_invoker::__call_impl[abi:ne190107]>(std::__function::__policy_storage const*) @ 0x0000000021227758
E           10. ./contrib/llvm-project/libcxx/include/__functional/function.h:716: ? @ 0x000000002121c7b1
E           11. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:465: DB::ReadWriteBufferFromHTTP::nextImpl() @ 0x0000000021224463
E           12. DB::ReadBuffer::next() @ 0x000000000c5e320b
E           13. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:254: DB::ReadWriteBufferFromHTTP::ReadWriteBufferFromHTTP(DB::HTTPConnectionGroupType const&, Poco::URI const&, String const&, DB::ProxyConfiguration, DB::ReadSettings, DB::ConnectionTimeouts, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*, unsigned long, unsigned long, std::function>&)>, bool, bool, std::vector>, bool, std::optional) @ 0x000000002121fb24
E           14. ./contrib/llvm-project/libcxx/include/__memory/unique_ptr.h:634: std::__unique_if::__unique_single std::make_unique[abi:ne190107]>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&>(DB::HTTPConnectionGroupType&, Poco::URI&, String&, DB::ProxyConfiguration&, DB::ReadSettings&, DB::ConnectionTimeouts&, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*&, unsigned long&, unsigned long&, std::function>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&) @ 0x000000002120b208
E           15. ./src/IO/ReadWriteBufferFromHTTP.h:248: DataLake::createReadBuffer(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x000000002848538a
E           16. ./build_docker/./src/Databases/DataLake/HTTPBasedCatalogUtils.cpp:50: DataLake::makeHTTPRequestAndReadJSON(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x0000000028485b35
E           17. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:55: DataLake::UnityCatalog::getJSONRequest(String const&, std::vector, std::allocator>> const&) const @ 0x0000000028472bae
E           18. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:146: DataLake::UnityCatalog::tryGetTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x00000000284796ba
E           19. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:96: DataLake::UnityCatalog::getTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x0000000028478cb0
E           20. ./build_docker/./src/Databases/DataLake/DatabaseDataLake.cpp:489: DB::DatabaseDataLake::getCreateTableQueryImpl(String const&, std::shared_ptr, bool) const @ 0x000000002842eed3
E           21. ./src/Databases/IDatabase.h:357: DB::InterpreterShowCreateQuery::executeImpl() @ 0x000000002acb98af
E           22. ./build_docker/./src/Interpreters/InterpreterShowCreateQuery.cpp:34: DB::InterpreterShowCreateQuery::execute() @ 0x000000002acb8f3f
E           23. ./build_docker/./src/Interpreters/executeQuery.cpp:1457: DB::executeQueryImpl(char const*, char const*, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum, DB::ReadBuffer*, std::shared_ptr&) @ 0x000000002ac0a883
E           24. ./build_docker/./src/Interpreters/executeQuery.cpp:1624: DB::executeQuery(String const&, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum) @ 0x000000002ac027a5
E           25. ./build_docker/./src/Server/TCPHandler.cpp:664: DB::TCPHandler::runImpl() @ 0x000000002fd87c16
E           26. ./build_docker/./src/Server/TCPHandler.cpp:2629: DB::TCPHandler::run() @ 0x000000002fdc78fa
E           27. ./build_docker/./base/poco/Net/src/TCPServerConnection.cpp:40: Poco::Net::TCPServerConnection::start() @ 0x000000003851884f
E           28. ./build_docker/./base/poco/Net/src/TCPServerDispatcher.cpp:115: Poco::Net::TCPServerDispatcher::run() @ 0x0000000038519517
E           29. ./build_docker/./base/poco/Foundation/src/ThreadPool.cpp:205: Poco::PooledThread::run() @ 0x0000000038427d6b
E           30. ./base/poco/Foundation/src/Thread_POSIX.cpp:335: Poco::ThreadImpl::runnableEntry(void*) @ 0x0000000038421968
E           31. asan_thread_start(void*) @ 0x000000000c357e77
E           . (RECEIVED_ERROR_FROM_REMOTE_IO_SERVER)
E           (query: SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`)

helpers/client.py:248: QueryRuntimeException
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config.
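Note on the failure above: the 404 body ("Schema not found: schema_with_complex_tables") means the Unity Catalog server has no record of the schema that the preceding Spark statements were supposed to create; since those statements run with ignore_exit_code=True, a Spark-side failure would go unnoticed until ClickHouse queries the catalog. A quick, hypothetical diagnostic is to probe the same REST endpoint that DataLake::UnityCatalog called before running SHOW CREATE TABLE; the helper below is an illustrative sketch, with the base URL taken verbatim from the error message.

# Hypothetical diagnostic (not part of the test): check whether the table is registered
# in the Unity Catalog that the DataLakeCatalog database points at.
import requests

UNITY_BASE = "http://localhost:8080/api/2.1/unity-catalog"  # URL from the error above

def table_is_registered(full_name: str) -> bool:
    # 200 -> the catalog knows the table; 404 -> the Spark-side CREATE never took effect.
    resp = requests.get(f"{UNITY_BASE}/tables/{full_name}", timeout=10)
    return resp.status_code == 200

print(table_is_registered("unity.schema_with_complex_tables.complex_table"))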
Files: config.xml, users.xml ------------------------------ Captured log setup ------------------------------ 2025-11-13 08:32:49.815000 [ 643 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.833000 [ 643 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:49.834000 [ 643 ] DEBUG : No running containers (conftest.py:95, cleanup_environment) 2025-11-13 08:32:49.834000 [ 643 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment) 2025-11-13 08:32:49.834000 [ 643 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.860000 [ 643 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check) 2025-11-13 08:32:49.863000 [ 643 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check) 2025-11-13 08:32:49.863000 [ 643 ] DEBUG : ENV DOCKER_KERBEROS_KDC_TAG 9391ecdee8d7 (cluster.py:424, __init__) 2025-11-13 08:32:49.863000 [ 643 ] DEBUG : ENV CLICKHOUSE_TESTS_SERVER_BIN_PATH /clickhouse (cluster.py:424, __init__) 2025-11-13 08:32:49.863000 [ 643 ] DEBUG : ENV MSAN_OPTIONS abort_on_error=1 poison_in_dtor=1 (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV JAVA_TOOL_OPTIONS -Djdk.attach.allowAttachSelf=true (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV TSAN_OPTIONS halt_on_error=1 abort_on_error=1 history_size=7 memory_limit_mb=46080 second_deadlock_stack=1 (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV HOSTNAME 0a121612553d (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV SHLVL 0 (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV HOME /root (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV OLDPWD / (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV DOCKER_HELPER_TAG 5dc43a6382f0 (cluster.py:424, __init__) 2025-11-13 08:32:49.864000 [ 643 ] DEBUG : ENV PYTHONUNBUFFERED 1 (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV DOCKER_PYTHON_BOTTLE_TAG d862517635bf (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV UBSAN_OPTIONS print_stacktrace=1 (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV PYTEST_ADDOPTS --dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV COMPOSE_HTTP_TIMEOUT 600 (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV DOCKER_MYSQL_PHP_CLIENT_TAG 88be89c1e3b6 (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV DOCKER_DOTNET_CLIENT_TAG 11de0b29a15d (cluster.py:424, __init__) 2025-11-13 08:32:49.865000 [ 643 ] DEBUG : ENV CLICKHOUSE_TESTS_CLIENT_BIN_PATH /clickhouse (cluster.py:424, __init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV DOCKER_MYSQL_JS_CLIENT_TAG 41ba7c2ec2a1 (cluster.py:424, __init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV CLICKHOUSE_USE_OLD_ANALYZER 1 (cluster.py:424, 
__init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV PATH /spark-3.3.2-bin-hadoop3/bin:/opt/gdb/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin (cluster.py:424, __init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV DOCKER_KERBERIZED_HADOOP_TAG latest (cluster.py:424, __init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV DOCKER_CHANNEL stable (cluster.py:424, __init__) 2025-11-13 08:32:49.866000 [ 643 ] DEBUG : ENV DOCKER_CLIENT_TIMEOUT 300 (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV DOCKER_POSTGRESQL_JAVA_CLIENT_TAG a4eff5c7f4d6 (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV DOCKER_NGINX_DAV_TAG b55ac9cd7519 (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV DOCKER_MYSQL_GOLANG_CLIENT_TAG 9bec2a638e6e (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV PWD /ClickHouse/tests/integration (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV DOCKER_MYSQL_JAVA_CLIENT_TAG 766bff31cfe4 (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV CLICKHOUSE_TESTS_BASE_CONFIG_DIR /clickhouse-config (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV TZ Etc/UTC (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV JAVA_PATH /usr/lib/jvm/java-11-openjdk-amd64/bin/java (cluster.py:424, __init__) 2025-11-13 08:32:49.867000 [ 643 ] DEBUG : ENV DOCKER_BASE_TAG 5ccda723c1fc (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV SPARK_HOME /spark-3.3.2-bin-hadoop3 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV LC_CTYPE C.UTF-8 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV INTEGRATION_TESTS_RUN_ID 1 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV WORKER_FREE_PORTS 30000 30001 30002 30003 30004 30005 30006 30007 30008 30009 30010 30011 30012 30013 30014 30015 30016 30017 30018 30019 30020 30021 30022 30023 30024 30025 30026 30027 30028 30029 30030 30031 30032 30033 30034 30035 30036 30037 30038 30039 30040 30041 30042 30043 30044 30045 30046 30047 30048 30049 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV PYTEST_XDIST_TESTRUNUID 45317bcafab547c2a01f1d9a2a119b29 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV PYTEST_XDIST_WORKER gw0 (cluster.py:424, __init__) 2025-11-13 08:32:49.868000 [ 643 ] DEBUG : ENV PYTEST_XDIST_WORKER_COUNT 10 (cluster.py:424, __init__) 2025-11-13 08:32:49.869000 [ 643 ] DEBUG : ENV PYTEST_CURRENT_TEST test_database_delta/test.py::test_complex_table_schema (setup) (cluster.py:424, __init__) 2025-11-13 08:32:49.869000 [ 643 ] DEBUG : CLUSTER INIT base_config_dir:/clickhouse-config (cluster.py:724, __init__) 2025-11-13 08:32:49.870000 [ 643 ] DEBUG : clickhouse_start_command: clickhouse server --config-file=/etc/clickhouse-server/{main_config_file} --log-file=/var/log/clickhouse-server/clickhouse-server.log --errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log (cluster.py:1662, add_instance) 2025-11-13 08:32:49.870000 [ 643 ] DEBUG : Cluster name: project_name:roottestdatabasedelta-gw0. 
Added instance name:node1 tag:latest base_cmd:['docker', 'compose', '--env-file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env', '--project-name', 'roottestdatabasedelta-gw0', '--file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml'] docker_compose_yml_dir:/ClickHouse/tests/integration/helpers/../../../tests/integration/compose/ (cluster.py:1948, add_instance) 2025-11-13 08:32:49.870000 [ 643 ] INFO : Starting cluster... (test.py:42, started_cluster) 2025-11-13 08:32:49.870000 [ 643 ] INFO : Running tests in /ClickHouse/tests/integration/test_database_delta/test.py (cluster.py:2738, start) 2025-11-13 08:32:49.871000 [ 643 ] DEBUG : Cluster start called. is_up=False (cluster.py:2745, start) 2025-11-13 08:32:49.893000 [ 643 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:49.918000 [ 643 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:49.939000 [ 643 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:49.939000 [ 643 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:32:49.969000 [ 643 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:49.992000 [ 643 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:50.017000 [ 643 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:50.017000 [ 643 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw0-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.044000 [ 643 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-11-13 08:32:50.044000 [ 643 ] DEBUG : No running containers for project: roottestdatabasedelta-gw0 (cluster.py:879, cleanup) 2025-11-13 08:32:50.044000 [ 643 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-11-13 08:32:50.072000 [ 643 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-11-13 08:32:50.073000 [ 643 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.096000 [ 643 ] DEBUG : Stderr:Error response from daemon: a prune operation is already running (cluster.py:147, run_and_check) 2025-11-13 08:32:50.097000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:50.097000 [ 643 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-11-13 08:32:50.097000 [ 643 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:50.125000 [ 643 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:50.125000 [ 643 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup) 2025-11-13 08:32:50.126000 [ 643 ] DEBUG : Setup directory for instance: node1 (cluster.py:2758, start) 2025-11-13 08:32:50.126000 [ 643 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:50.127000 [ 643 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:50.127000 [ 643 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:50.128000 [ 643 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:50.128000 [ 643 ] DEBUG : Copy custom test config files [] to /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:50.128000 [ 643 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/database (cluster.py:4758, create_dir) 2025-11-13 08:32:50.129000 [ 643 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:50.129000 [ 643 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:50.129000 [ 643 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw'} stored in /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env (cluster.py:96, _create_env_file) 2025-11-13 08:32:50.130000 [ 643 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:50.130000 [ 643 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:50.130000 [ 643 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:50.130000 [ 643 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:50.141000 [ 643 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request) 2025-11-13 08:32:50.142000 [ 643 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml pull] (cluster.py:121, run_and_check) 2025-11-13 08:33:00.655000 [ 643 ] DEBUG : Stderr: node1 Pulling (cluster.py:147, run_and_check) 2025-11-13 08:33:00.656000 [ 643 ] DEBUG : Stderr: node1 Pulled (cluster.py:147, run_and_check) 2025-11-13 08:33:00.656000 [ 643 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file 
/ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml up -d --no-recreate') (cluster.py:3139, start) 2025-11-13 08:33:00.656000 [ 643 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:33:01.577000 [ 643 ] DEBUG : ClickHouse instance created (cluster.py:3147, start) 2025-11-13 08:33:01.578000 [ 643 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2005, get_instance_ip) 2025-11-13 08:33:01.579000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.580000 [ 643 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:33:01.581000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.582000 [ 643 ] DEBUG : Waiting for ClickHouse start in node1, ip: 172.16.3.2... 
(cluster.py:3155, start) 2025-11-13 08:33:01.583000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.585000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.687000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.790000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.892000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:01.996000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.099000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.204000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.308000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.412000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.517000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.621000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.725000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.828000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:02.932000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:03.036000 [ 643 ] DEBUG : http://localhost:None "GET /v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:03.139000 [ 643 ] DEBUG : http://localhost:None "GET 
/v1.46/containers/df212921d9877ad8755a2fc278caf8d4f4f08ea074621f83009d2c34d57c7482/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:33:03.140000 [ 643 ] DEBUG : ClickHouse node1 started (cluster.py:3159, start) 2025-11-13 08:33:03.140000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:False cmd: ['bash', '-c', 'cd /unitycatalog && nohup bin/start-uc-server &'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:03.141000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /unitycatalog && nohup bin/start-uc-server &] (cluster.py:121, run_and_check) ------------------------------ Captured log call ------------------------------- 2025-11-13 08:33:05.210000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:05.210000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:11.179000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:11.179000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:11.179000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:11.179000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:11.179000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-0e7f2f7b-030e-44e7-9e7d-f65cc5f1c9f6;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 
08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4242ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:11.180000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. 
url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.181000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:11.182000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, 
run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.183000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.184000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.185000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 
2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.186000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.187000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: 
-- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.188000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.189000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:11.190000 [ 643 ] DEBUG : 
Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check)
2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-11-13 08:33:11.190000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
2025-11-13 08:33:11.191000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf
"spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location \'/tmp/complex_schema/complex_table\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:11.192000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location '/tmp/complex_schema/complex_table'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:16.957000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:16.957000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:16.957000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:16.957000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:16.957000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-4572e556-6e06-4509-bc00-6b8989ca1e13;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.958000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4209ms :: artifacts dl 1ms (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:16.959000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.960000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.961000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.962000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.963000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 
643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.964000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.965000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.966000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 
643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.967000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.968000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.969000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.970000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.971000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:16.971000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.971000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.971000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.971000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.972000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.973000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.973000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:16.973000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.973000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.973000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.974000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.974000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.974000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.974000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.974000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.975000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.975000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.975000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.975000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.976000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.976000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:16.976000 [ 643 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.976000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:16.976000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.977000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:16.978000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.979000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.980000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.981000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.981000 [ 643 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.981000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.981000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.981000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date(\'2024-10-01\', \'yyyy-MM-dd\'), to_timestamp(\'2024-10-01 00:12:00\'), array(42, 123, 77), map(7, \'v7\', 5, \'v5\'), named_struct(\\"f1\\", 34, \\"f2\\", \'hello\')" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:16.982000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\"f1\", 34, \"f2\", 'hello')" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:22.880000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:22.880000 [ 643 ] DEBUG : Stderr:The 
jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-9361ad15-99ae-4bf6-9d60-2d3bab20d0b4;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.881000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.882000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4226ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:22.883000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.884000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:22.885000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.886000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.887000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.888000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 
643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.889000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.890000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.891000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.892000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 
643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.893000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.894000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.894000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.894000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.894000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.894000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.895000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.895000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.895000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.895000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.895000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.896000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.897000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.898000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.899000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.900000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.901000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.901000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.901000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.901000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.901000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.902000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.903000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.904000 [ 643 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:22.905000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.906000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:33:22.907000 [ 643 ] DEBUG : Executing query create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-11-13 08:33:23.224000 [ 643 ] DEBUG : Executing query SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%' on node1 (cluster.py:3648, query) 2025-11-13 08:33:24.244000 [ 643 ] DEBUG : Executing query SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table` on node1 (cluster.py:3648, query)
______________________ test_embedded_database_and_tables _______________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_embedded_database_and_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        node1.query("create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        default_tables = list(sorted(node1.query("SHOW TABLES FROM unity_test LIKE 'default%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        print("Default tables", default_tables)
        assert default_tables == ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries']
        for table in default_tables:
            if table == "default.marksheet_uniform":
                continue
            assert "DeltaLake" in node1.query(f"show create table unity_test.`{table}`")
            if table in ('default.marksheet', 'default.user_countries'):
                data_clickhouse = TSV(node1.query(f"SELECT * FROM unity_test.`{table}` ORDER BY 1,2,3"))
>               data_spark = TSV(execute_spark_query(node1, f"SELECT * FROM unity.{table} ORDER BY 1,2,3"))

test_database_delta/test.py:90: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_database_delta/test.py:54: in execute_spark_query
    return node.exec_in_container(
helpers/cluster.py:4117: in exec_in_container
    return self.cluster.exec_in_container(
helpers/cluster.py:2069: in exec_in_container
    result = subprocess_check_call(
helpers/cluster.py:239: in subprocess_check_call
    return run_and_check(args, detach=detach, nothrow=nothrow, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = ['docker', 
'exec', 'roottestdatabasedelta-gw0-node1-1', 'bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql ...tCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n']
env = None, shell = False, stdout = -1, stderr = -1, timeout = 300
nothrow = False, detach = False

    def run_and_check(
        args: Union[Sequence[str], str],
        env=None,
        shell=False,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        timeout=300,
        nothrow=False,
        detach=False,
    ) -> str:
        if shell:
            if isinstance(args, str):
                shell_args = args
            else:
                shell_args = next(a for a in args)
        else:
            shell_args = " ".join(args)
        logging.debug("Command:[%s]", shell_args)
        if detach:
            subprocess.Popen(
                args,
                stdout=subprocess.DEVNULL,
                stderr=subprocess.DEVNULL,
                env=env,
                shell=shell,
            )
            return ""
        res = subprocess.run(
            args,
            stdout=stdout,
            stderr=stderr,
            env=env,
            shell=shell,
            timeout=timeout,
            check=False,
        )
        out = res.stdout.decode("utf-8", "ignore")
        err = res.stderr.decode("utf-8", "ignore")
        # check_call(...) from subprocess does not print stderr, so we do it manually
        for outline in out.splitlines():
            logging.debug("Stdout:%s", outline)
        for errline in err.splitlines():
            logging.debug("Stderr:%s", errline)
        if res.returncode != 0:
            logging.debug("Exitcode:%s", res.returncode)
            if env:
                logging.debug("Env:%s", env)
            if not nothrow:
>               raise Exception(
                    f"Command [{shell_args}] return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}"
                )
E Exception: Command [docker exec roottestdatabasedelta-gw0-node1-1 bash -c E cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ E --master "local[*]" \ E --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ E --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ E --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ E --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ E --conf "spark.sql.catalog.unity.token=" \ E --conf "spark.sql.defaultCatalog=unity" \ E -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' E ] return non-zero code 1: Ivy Default Cache set to: /root/.ivy2/cache E The jars for the packages stored in: /root/.ivy2/jars E org.apache.hadoop#hadoop-aws added as a dependency E io.delta#delta-spark_2.12 added as a dependency E io.unitycatalog#unitycatalog-spark_2.12 added as a dependency E :: resolving dependencies :: org.apache.spark#spark-submit-parent-c768d849-c4f9-485e-bb18-06b4ed14a6f5;1.0 E confs: [default] E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. 
E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E :: resolution report :: resolve 4249ms :: artifacts dl 0ms E :: modules in use: E --------------------------------------------------------------------- E | | modules || artifacts | E | conf | number| search|dwnlded|evicted|| number|dwnlded| E --------------------------------------------------------------------- E | default | 3 | 0 | 0 | 0 || 0 | 0 | E --------------------------------------------------------------------- E E :: problems summary :: E :::: WARNINGS E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E module not found: org.apache.hadoop#hadoop-aws;3.3.4 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar E E ==== central: tried E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E module not found: io.delta#delta-spark_2.12;3.2.1 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: UNRESOLVED DEPENDENCIES :: E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: org.apache.hadoop#hadoop-aws;3.3.4: not found E E :: io.delta#delta-spark_2.12;3.2.1: not found E E :: 
io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found E E :::::::::::::::::::::::::::::::::::::::::::::: E E E E :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS E Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] E at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) E at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) E at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) E at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) E at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) E at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) E at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) E at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) E at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) E at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) helpers/cluster.py:153: Exception ----------------------------- Captured stdout call ----------------------------- Default tables ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries'] ------------------------------ Captured log call ------------------------------- 2025-11-13 08:33:24.763000 [ 643 ] DEBUG : Executing query create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-11-13 08:33:24.979000 [ 643 ] DEBUG : Executing query SHOW TABLES FROM unity_test LIKE 'default%' on node1 (cluster.py:3648, query) 2025-11-13 08:33:25.395000 [ 643 ] DEBUG : Executing query show create table unity_test.`default.marksheet` on node1 (cluster.py:3648, query) 2025-11-13 08:33:25.611000 [ 643 ] DEBUG : Executing query SELECT * FROM unity_test.`default.marksheet` ORDER BY 1,2,3 on node1 (cluster.py:3648, query) 2025-11-13 08:33:25.877000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:False cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:25.877000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages 
"org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-c768d849-c4f9-485e-bb18-06b4ed14a6f5;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:31.620000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4249ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:31.621000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.622000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.623000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.624000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 
643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.625000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.626000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.627000 [ 
643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.628000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:31.629000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.630000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.631000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:31.632000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:31.633000 [ 643 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-11-13 08:33:31.633000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
_________________________ test_multiple_schemes_tables _________________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_multiple_schemes_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_multiple_spark_queries(node1, [f'CREATE SCHEMA test_schema{i}' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'CREATE TABLE test_schema{i}.test_table{i} (col1 int, col2 double) using Delta location \'/tmp/test_schema{i}/test_table{i}\'' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'INSERT INTO test_schema{i}.test_table{i} VALUES ({i}, {i}.0)' for i in range(10)], True)
        node1.query("create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        multi_schema_tables = list(sorted(node1.query("SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        print(multi_schema_tables)
        for i, table in enumerate(multi_schema_tables):
>           assert node1.query(f"SELECT col1 FROM multi_schema_test.`{table}`").strip() == str(i)

test_database_delta/test.py:107: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")

        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")

        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed! Return code: 62, stderr: Code: 62. DB::Exception: Syntax error: failed at position 36 (``): ``. Expected one of: Colon, Caret, identifier, end of query. (SYNTAX_ERROR), Stack trace (when copying this message, always include the lines below):
E
E           0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x000000003833d451
E           1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000001bd6da31
E           2. DB::Exception::createDeprecated(String const&, int, bool) @ 0x000000000c3fc730
E           3. ./build_docker/./src/Parsers/parseQuery.cpp:411: DB::parseQueryAndMovePosition(DB::IParser&, char const*&, char const*, String const&, bool, unsigned long, unsigned long, unsigned long) @ 0x0000000031baf4f5
E           4. ./build_docker/./src/Client/ClientBase.cpp:402: DB::ClientBase::parseQuery(char const*&, char const*, DB::Settings const&, bool) @ 0x000000002fa55369
E           5. ./build_docker/./src/Client/ClientBase.cpp:2369: DB::ClientBase::analyzeMultiQueryText(char const*&, char const*&, char const*, String&, std::shared_ptr&, String const&, std::unique_ptr>&) @ 0x000000002fa7c487
E           6. ./build_docker/./src/Client/ClientBase.cpp:2507: DB::ClientBase::executeMultiQuery(String const&) @ 0x000000002fa7ddb2
E           7. ./build_docker/./src/Client/ClientBase.cpp:2776: DB::ClientBase::processQueryText(String const&) @ 0x000000002fa8119e
E           8. ./build_docker/./src/Client/ClientBase.cpp:3429: DB::ClientBase::runNonInteractive() @ 0x000000002fa9355b
E           9. ./build_docker/./programs/client/Client.cpp:407: DB::Client::main(std::vector> const&) @ 0x000000001c24ccf7
E           10. ./build_docker/./base/poco/Util/src/Application.cpp:315: Poco::Util::Application::run() @ 0x000000003857ac57
E           11. ./build_docker/./programs/client/Client.cpp:1141: mainEntryClickHouseClient(int, char**) @ 0x000000001c263d89
E           12. ./build_docker/./programs/main.cpp:295: main @ 0x000000000c39489f
E           13. ? @ 0x00007fcaa826fd90
E           14. ? @ 0x00007fcaa826fe40
E           15.
_start @ 0x000000000c2bd02e helpers/client.py:248: QueryRuntimeException ----------------------------- Captured stdout call ----------------------------- [''] ------------------------------ Captured log call ------------------------------- 2025-11-13 08:33:31.779000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:31.779000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-c3e9be1f-013c-4007-bc07-1b87e1a6c321;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:37.808000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:37.809000 [ 643 ] DEBUG : Stderr:You probably 
access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.809000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.809000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.809000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.810000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.811000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.811000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:37.811000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4343ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:37.811000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.811000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:37.812000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.813000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.813000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:37.813000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. 
url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.813000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.813000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:37.814000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.815000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.816000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.817000 [ 643 ] DEBUG : Stderr: (cluster.py:147, 
run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.818000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.819000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:37.820000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.821000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.822000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.823000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 
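[Editor's note] The captured stdout for the failed test above is [''] and the ClickHouse SYNTAX_ERROR fails at position 36 on an empty backtick identifier. Taken together with the unresolved --packages above, the chain is: the Spark-side CREATE TABLE statements never ran, SHOW TABLES FROM multi_schema_test returned an empty string, .strip().split('\n') produced [''], and the test then issued SELECT col1 FROM multi_schema_test.``. A minimal sketch of a guard for that failure mode follows; it reuses the node1 instance from the test, and list_multi_schema_tables / expected_count are hypothetical names, not part of the test suite:

    # Sketch only, assuming the test's own `node1` instance; fail with a clear message
    # instead of producing an empty identifier when the Spark-side setup did not run.
    def list_multi_schema_tables(node1, expected_count=10):
        raw = node1.query(
            "SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'",
            settings={"use_hive_partitioning": "0"},
        ).strip()
        tables = sorted(t for t in raw.split("\n") if t)  # drop the '' left by an empty result
        assert len(tables) == expected_count, (
            f"expected {expected_count} Delta tables in the Unity Catalog database, got {tables!r}"
        )
        return tables

A failure here would point at the Spark / Unity Catalog setup rather than at ClickHouse SQL parsing.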
2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.824000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.825000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.826000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.827000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: 
-- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.828000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.829000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:37.830000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.831000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : 
Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.832000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.833000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.834000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:33:37.835000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:33:37.835000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf 
"spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location \'/tmp/test_schema0/test_table0\';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location \'/tmp/test_schema1/test_table1\';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location \'/tmp/test_schema2/test_table2\';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location \'/tmp/test_schema3/test_table3\';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location \'/tmp/test_schema4/test_table4\';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location \'/tmp/test_schema5/test_table5\';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location \'/tmp/test_schema6/test_table6\';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location \'/tmp/test_schema7/test_table7\';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location \'/tmp/test_schema8/test_table8\';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location \'/tmp/test_schema9/test_table9\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:37.835000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location '/tmp/test_schema0/test_table0';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location '/tmp/test_schema1/test_table1';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location '/tmp/test_schema2/test_table2';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location '/tmp/test_schema3/test_table3';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location '/tmp/test_schema4/test_table4';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location '/tmp/test_schema5/test_table5';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location '/tmp/test_schema6/test_table6';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location '/tmp/test_schema7/test_table7';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location '/tmp/test_schema8/test_table8';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location '/tmp/test_schema9/test_table9'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:43.743000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:43.743000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars 
(cluster.py:147, run_and_check) 2025-11-13 08:33:43.743000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:43.743000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:43.743000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-9f057979-7f2b-4f24-861f-4eb103612efe;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.744000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4236ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:43.745000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.746000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.747000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.748000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.749000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.750000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 
643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.751000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.752000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:43.753000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 
643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.754000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.755000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.756000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.757000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.758000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.759000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.759000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.759000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.759000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.760000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:43.761000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:43.762000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.763000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.764000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.765000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:33:43.765000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:33:43.765000 [ 643 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:33:43.765000 [ 643 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT 
INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:49.656000 [ 643 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-eed6043f-f410-472c-895d-51f523048006;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.657000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr::: resolution report :: resolve 4216ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.658000 [ 643 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.659000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.660000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 
643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.661000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 
643 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.662000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.663000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.664000 [ 643 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.665000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.666000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.667000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:33:49.668000 [ 643 ] DEBUG : Executing query create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-11-13 08:33:49.883000 [ 643 ] DEBUG : Executing query SHOW TABLES FROM multi_schema_test LIKE 'test_schema%' on node1 (cluster.py:3648, query) 2025-11-13 08:33:50.350000 [ 643 ] DEBUG : Executing query SELECT col1 FROM multi_schema_test.`` on node1 (cluster.py:3648, query) ---------------------------- Captured log teardown ----------------------------- 2025-11-13 08:33:50.980000 [ 643 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check) 2025-11-13 08:33:56.596000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:56.596000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:56.596000 [ 643 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check) 2025-11-13 08:33:56.611000 [ 643 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml down --volumes] (cluster.py:121, run_and_check) 2025-11-13 08:33:57.079000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:33:57.079000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:33:57.079000 [ 643 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:57.079000 [ 643 ] 
DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:57.080000 [ 643 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Removing (cluster.py:147, run_and_check) 2025-11-13 08:33:57.080000 [ 643 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Removed (cluster.py:147, run_and_check) 2025-11-13 08:33:57.080000 [ 643 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:33:57.104000 [ 643 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:33:57.127000 [ 643 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:33:57.156000 [ 643 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:33:57.156000 [ 643 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw0-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-11-13 08:33:57.184000 [ 643 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-11-13 08:33:57.185000 [ 643 ] DEBUG : No running containers for project: roottestdatabasedelta-gw0 (cluster.py:879, cleanup) 2025-11-13 08:33:57.185000 [ 643 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-11-13 08:33:57.211000 [ 643 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-11-13 08:33:57.211000 [ 643 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-11-13 08:33:57.258000 [ 643 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check) 2025-11-13 08:33:57.258000 [ 643 ] DEBUG : Images pruned (cluster.py:904, cleanup) 2025-11-13 08:33:57.258000 [ 643 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-11-13 08:33:57.259000 [ 643 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:33:57.288000 [ 643 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:33:57.288000 [ 643 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup)
----------------- generated report log file: parallel0_1.jsonl -----------------
============================== slowest durations ===============================
19.35s call test_database_delta/test.py::test_complex_table_schema
19.09s call test_database_delta/test.py::test_multiple_schemes_tables
17.43s setup test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
15.39s setup test_database_delta/test.py::test_complex_table_schema
12.94s setup test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
9.35s teardown test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
6.87s call test_database_delta/test.py::test_embedded_database_and_tables
6.31s teardown test_database_delta/test.py::test_multiple_schemes_tables
4.94s teardown test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
3.82s call test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
0.32s call test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
0.00s teardown test_database_delta/test.py::test_complex_table_schema
0.00s teardown test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_multiple_schemes_tables
=========================== short test summary info ============================
FAILED test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
FAILED test_database_delta/test.py::test_complex_table_schema - helpers.clien...
FAILED test_database_delta/test.py::test_embedded_database_and_tables - Excep...
FAILED test_database_delta/test.py::test_multiple_schemes_tables - helpers.cl...
SKIPPED [1] test_asynchronous_metric_jemalloc_profile_active/test.py:30: Disabled for sanitizers
=================== 4 failed, 1 skipped in 69.52s (0:01:09) ====================
Traceback (most recent call last):
  File "/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration/./runner", line 492, in
    subprocess.check_call(cmd, shell=True, bufsize=0)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_hlyj5n --privileged --dns-search='.'
--memory=30709030912 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e CLICKHOUSE_USE_OLD_ANALYZER=1 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 ' returned non-zero exit status 1.